Support recovery without incoherence: A case for nonconvex regularization

Authors

  • Po-Ling Loh
  • Martin J. Wainwright
Abstract

We develop a new primal-dual witness proof framework that may be used to establish variable selection consistency and ℓ∞-bounds for sparse regression problems, even when the loss function and regularizer are nonconvex. We use this method to prove two theorems concerning support recovery and ℓ∞-guarantees for a regression estimator in a general setting. Notably, our theory applies to all potential stationary points of the objective and certifies that the stationary point is unique under mild conditions. Our results provide a strong theoretical justification for the use of nonconvex regularization: For certain nonconvex regularizers with vanishing derivative away from the origin, any stationary point can be used to recover the support without requiring the typical incoherence conditions present in ℓ1-based methods. We also derive corollaries illustrating the implications of our theorems for composite objective functions involving losses such as least squares, nonconvex modified least squares for errors-in-variables linear regression, the negative log likelihood for generalized linear models, and the graphical Lasso. We conclude with empirical studies that corroborate our theoretical predictions.
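
As a concrete illustration of the "vanishing derivative" property invoked above, the following minimal NumPy sketch implements the MCP penalty, one standard member of this regularizer family (the parameter names lam and gamma are our own, chosen for illustration), and checks that its derivative is exactly zero beyond gamma*lam, in contrast to the ℓ1 penalty:

    import numpy as np

    def mcp_penalty(t, lam, gamma):
        """MCP penalty: quadratic near the origin, constant for |t| >= gamma*lam."""
        a = np.abs(t)
        return np.where(a <= gamma * lam, lam * a - a**2 / (2 * gamma), gamma * lam**2 / 2)

    def mcp_derivative(t, lam, gamma):
        """Derivative of the MCP penalty; it vanishes once |t| >= gamma*lam."""
        a = np.abs(t)
        return np.sign(t) * np.maximum(lam - a / gamma, 0.0)

    lam, gamma = 0.5, 3.0
    print(mcp_derivative(np.array([0.1, 1.0, 2.0, 5.0]), lam, gamma))
    # the last two entries are exactly 0, since |t| >= gamma * lam = 1.5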


Related articles

Optimal Computational and Statistical Rates of Convergence for Sparse Nonconvex Learning Problems.

We provide theoretical analysis of the statistical and computational properties of penalized M-estimators that can be formulated as the solution to a possibly nonconvex optimization problem. Many important estimators fall in this category, including least squares regression with nonconvex regularization, generalized linear models with nonconvex regularization and sparse elliptical random design...
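
In our own (standard) notation, the estimators in question share the generic composite form

    \hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p} \Big\{ \mathcal{L}_n(\beta) + \sum_{j=1}^{p} p_{\lambda}(|\beta_j|) \Big\},

where \mathcal{L}_n is an empirical loss (e.g., least squares or a negative log-likelihood) and p_{\lambda} is a possibly nonconvex penalty such as SCAD or MCP; this is a sketch of the setup, not the paper's exact formulation.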

Efficient ℓq Minimization Algorithms for Compressive Sensing Based on Proximity Operator

This paper considers solving the unconstrained ℓq-norm (0 ≤ q < 1) regularized least squares (ℓq-LS) problem for recovering sparse signals in compressive sensing. We propose two highly efficient first-order algorithms via incorporating the proximity operator for nonconvex ℓq-norm functions into the fast iterative shrinkage/thresholding algorithm (FISTA) and the alternating direction method of multipliers...
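
As a rough sketch of the building block involved (not the paper's actual algorithms), the scalar proximity operator of the ℓq penalty can be evaluated numerically and plugged into a plain proximal-gradient loop; FISTA adds momentum and ADMM adds variable splitting on top of this step. The function names and the grid-search prox below are our own illustrative choices:

    import numpy as np

    def prox_lq_scalar(v, tau, q, grid=2000):
        """Numerically solve argmin_t 0.5*(t - v)**2 + tau*|t|**q for 0 < q < 1.
        The minimizer has the sign of v and magnitude in [0, |v|], so a 1-D grid search suffices here."""
        ts = np.linspace(0.0, abs(v), grid)
        obj = 0.5 * (ts - abs(v))**2 + tau * ts**q
        return np.sign(v) * ts[np.argmin(obj)]

    def ista_lq(A, y, lam, q, step, iters=200):
        """Plain proximal-gradient loop for 0.5*||A x - y||^2 + lam * sum_j |x_j|^q.
        step should be at most 1 / ||A||_2^2 for the gradient step to be stable."""
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            v = x - step * (A.T @ (A @ x - y))
            x = np.array([prox_lq_scalar(vi, step * lam, q) for vi in v])
        return x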

Robust sparse recovery for compressive sensing in impulsive noise using ℓp-norm model fitting

This work considers the robust sparse recovery problem in compressive sensing (CS) in the presence of impulsive measurement noise. We propose a robust formulation for sparse recovery using the generalized ℓp-norm with 0 < p < 2 as the metric for the residual error under ℓ1-norm regularization. An alternating direction method (ADM) has been proposed to solve this formulation efficiently. Moreove...
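
In our own notation, the regularized formulation being considered is roughly of the form

    \min_{x} \; \|A x - y\|_p^p + \lambda \|x\|_1, \qquad 0 < p < 2,

where smaller values of p downweight large (impulsive) residuals. A typical ADM-style splitting introduces an auxiliary residual variable z = A x - y so that the ℓp term and the ℓ1 term are handled in separate, simpler subproblems; this is a sketch of the general idea rather than the paper's exact algorithm.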

Convergence Results on Proximal Method of Multipliers in Nonconvex Programming

We describe a primal-dual application of the proximal point algorithm to nonconvex minimization problems. Motivated by the work of Spingarn and, more recently, by the work of Kaplan and Tichatschke on the proximal point methodology in nonconvex optimization, this paper discusses some local results in two directions. The first one concerns the application of the proximal method of multipliers t...
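
For reference, the classical proximal method of multipliers (Rockafellar's form, written here in our notation for an equality-constrained problem min_x f(x) subject to g(x) = 0) iterates

    x^{k+1} \in \arg\min_{x} \Big\{ f(x) + \langle y^k, g(x) \rangle + \frac{c_k}{2} \|g(x)\|^2 + \frac{1}{2 c_k} \|x - x^k\|^2 \Big\},
    \qquad y^{k+1} = y^k + c_k \, g(x^{k+1}),

i.e., the augmented Lagrangian update with an additional primal proximal term; the local results discussed here concern the behavior of iterations of this type when f and g are not convex.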

Implicit Regularization in Nonconvex Statistical Estimation: Gradient Descent Converges Linearly for Phase Retrieval, Matrix Completion and Blind Deconvolution

Recent years have seen a flurry of activity in designing provably efficient nonconvex procedures for solving statistical estimation problems. Due to the highly nonconvex nature of the empirical loss, state-of-the-art procedures often require proper regularization (e.g. trimming, regularized cost, projection) in order to guarantee fast convergence. For vanilla procedures such as gradient descen...
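
To make "vanilla procedure" concrete for one of the named problems, the following toy sketch (our own code, not the paper's implementation) runs unregularized gradient descent for real-valued phase retrieval from a spectral initialization; note that there is no trimming, regularized cost, or projection anywhere in the loop:

    import numpy as np

    def vanilla_gd_phase_retrieval(A, y, step, iters=1000):
        """Gradient descent on f(x) = (1/4m) * sum(((A x)^2 - y)^2) with spectral initialization
        and no explicit regularization -- the 'vanilla' procedure referred to above."""
        m, n = A.shape
        # spectral initialization: leading eigenvector of (1/m) * A^T diag(y) A,
        # scaled to the estimated signal energy (mean(y) approximates ||x||^2)
        Y = (A * y[:, None]).T @ A / m
        _, V = np.linalg.eigh(Y)
        x = V[:, -1] * np.sqrt(y.mean())
        for _ in range(iters):
            Ax = A @ x
            grad = A.T @ ((Ax**2 - y) * Ax) / m
            x = x - step * grad
        return x

    # toy usage: x_hat should approximate x_true up to a global sign flip
    rng = np.random.default_rng(0)
    n, m = 20, 200
    x_true = rng.normal(size=n)
    A = rng.normal(size=(m, n))
    y = (A @ x_true)**2
    x_hat = vanilla_gd_phase_retrieval(A, y, step=0.1 / y.mean())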


Journal:
  • CoRR

Volume: abs/1412.5632  Issue: -

Pages: -

Publication date: 2014